Big O (O(n))
An algorithm is said to run in time T(N) = O(f(N)) if there are constants c and n₀ such that T(N) ≤ cf(N) when N ≥ n₀.
When we say that T(N) = O(f(N)), we are guaranteeing that the function T(N) grows at a rate no faster than f(N); thus f(N) is an upper bound on T(N).
Example:
If g(x) = x^2 + x^1.5 + 10.
Then g(x) = O(x^2)
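As a quick sanity check, the short Python sketch below verifies the claim numerically using one possible (assumed, not unique) choice of witness constants, c = 12 and n₀ = 1: for x ≥ 1 we have x^1.5 ≤ x^2 and 10 ≤ 10x^2, so g(x) ≤ 12x^2.

```python
# Numerical sanity check of the Big-O claim above.
# Assumed witness constants (one valid choice, not unique): c = 12, n0 = 1.
# For x >= 1: x^1.5 <= x^2 and 10 <= 10*x^2, hence g(x) <= 12*x^2.

def g(x):
    return x**2 + x**1.5 + 10

c, n0 = 12, 1
for x in range(n0, 10_000):
    assert g(x) <= c * x**2, f"bound fails at x = {x}"

print("g(x) <= 12*x^2 for all tested x >= 1, consistent with g(x) = O(x^2)")
```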
Omega (Ω(n))
An algorithm is said to run in time T(N) = Ω(f(N)) if there are constants c and n₀ such that T(N) ≥ cf(N) when N ≥ n₀. Thus f(N) is a lower bound on T(N).
Example:
If g(x) = x^2 + x^1.5 + 10.
Then g(x) = Ω(x^1.5)
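The same kind of check works for the lower bound. With the assumed witness constants c = 1 and n₀ = 1, the inequality holds because the remaining terms x^2 and 10 are nonnegative, so g(x) ≥ x^1.5 for every x ≥ 1.

```python
# Numerical sanity check of the Omega claim above.
# Assumed witness constants: c = 1, n0 = 1.  Since x^2 >= 0 and 10 >= 0,
# g(x) = x^2 + x^1.5 + 10 >= 1 * x^1.5 for every x >= 1.

def g(x):
    return x**2 + x**1.5 + 10

c, n0 = 1, 1
for x in range(n0, 10_000):
    assert g(x) >= c * x**1.5, f"bound fails at x = {x}"

print("g(x) >= x^1.5 for all tested x >= 1, consistent with g(x) = Omega(x^1.5)")
```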
Theta (Θ(n))
An algorithm is said to run in time T(N) = Θ(h(N)) if and only if T(N) = O(h(N)) and T(N) = Ω(h(N)).
Example:
If g(n) = log(n^100) = 100 log(n).
Then g(n) = Θ(log(n))
Note that the 100 is a constant factor and is therefore not included in the Θ.
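The sketch below illustrates this with assumed witness constants c1 = 1, c2 = 100, and n₀ = 2 (so that log(n) > 0): since log(n^100) = 100 log(n), the function is sandwiched between c1·log(n) and c2·log(n), which is exactly the Θ(log(n)) claim with the constant 100 absorbed into c2.

```python
# Numerical sanity check of the Theta claim above.
# Assumed witness constants: c1 = 1, c2 = 100, n0 = 2 (so log(n) > 0).
# Since log(n^100) = 100*log(n), we get c1*log(n) <= g(n) <= c2*log(n).
import math

def g(n):
    return math.log(n**100)       # equals 100 * math.log(n) up to rounding

c1, c2, n0 = 1, 100, 2
for n in range(n0, 1_000):
    assert c1 * math.log(n) <= g(n) <= c2 * math.log(n) + 1e-9

print("log(n) <= log(n^100) <= 100*log(n) for all tested n >= 2,")
print("consistent with g(n) = Theta(log(n))")
```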